
Assistive Robotic Transport for Youngsters (ARTY)

ARTY is a smart paediatric wheelchair designed with a focus on safety. Its primary objective is to provide disabled children with an independent lifestyle, therapists with a training tool, and scientists with an experimental platform.

ARTY is built around Skippi, an Ottobock children's powered wheelchair. Skippi fulfills several of the key features that, according to the National Association of Paediatric Occupational Therapists (NAPOT), an ideal children's wheelchair should possess: a "fun" appearance, the ability to serve as a training aid in the early stages, options to accommodate different seating systems, and a variety of control options (e.g., joystick, switches, head array, touch pad, sip and puff, voice, among others). Skippi is colorful, easily transportable and relatively lightweight, has adjustable seats, runs for more than a day on a single charge, and is not prohibitively expensive. Furthermore, Skippi uses a Controller Area Network (CAN) based electronic system. ARTY adds three Hokuyo URG-04LX laser scanners, a tablet PC that provides a touch interface for changing the wheelchair's basic settings, and a wireless router.

Several software components are associated with ARTY, all of them written on top of ROS. These components (or nodes) can be classified into four areas based on the type of task they perform:

1. Wheelchair control and navigation

The following nodes perform tasks in the control and navigation area:

  • Motor access/control (MAC) and joystick reader nodes: responsible for the communication with Skippi, they perform the necessary translation between CAN messages and command velocities. In addition, the MAC node provides the system with odometry information, which is useful for the mapping and localization tasks (a sketch of this kind of translation appears after the list).
  • User interface node: displayed on the tablet PC, it enables the user to turn the wheelchair on and off and to change basic settings such as the maximum translational and rotational velocities.
  • Shared control node - Obstacle Avoidance: this module moderates the input joystick signal according to the wheelchair's proximity to obstacles, in order to avoid collisions, using a hybrid shared-control (HSC) method that combines the Combined Vector Field (CVF) and Dynamic Window Approach (DWA) algorithms, both commonly used in robot navigation. The node provides three shared-control levels: basic (no modulation), safeguarding (CVF is not used; the scaled command velocities are generated directly from the user's control velocities) and assisted control (the full HSC method is applied). A sketch of the safeguarding idea also appears after the list.

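To make the CAN translation concrete, below is a minimal, hypothetical sketch of a MAC-style node written with rospy and the python-can library. The arbitration ID, payload layout and interface name are assumptions for illustration only and do not reflect Skippi's actual CAN protocol.

    #!/usr/bin/env python
    # Hypothetical MAC-style node: translates ROS velocity commands into CAN
    # frames. The arbitration ID and payload layout are NOT Skippi's protocol.
    import struct

    import can                      # python-can, assumed to be installed
    import rospy
    from geometry_msgs.msg import Twist

    CMD_CAN_ID = 0x101              # hypothetical arbitration ID for drive commands


    class MotorAccessControl(object):
        def __init__(self):
            # The SocketCAN channel name is an assumption (e.g. a USB-CAN adapter).
            self.bus = can.interface.Bus(channel='can0', bustype='socketcan')
            rospy.Subscriber('cmd_vel', Twist, self.on_cmd_vel)

        def on_cmd_vel(self, twist):
            # Pack translational (m/s) and rotational (rad/s) velocities into a
            # hypothetical 8-byte payload of two little-endian floats.
            payload = struct.pack('<ff', twist.linear.x, twist.angular.z)
            frame = can.Message(arbitration_id=CMD_CAN_ID, data=payload,
                                is_extended_id=False)
            self.bus.send(frame)


    if __name__ == '__main__':
        rospy.init_node('mac_node')
        MotorAccessControl()
        rospy.spin()
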
Take a look at Soh and Demiris (2012) for more information about these nodes.
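
The safeguarding level can be pictured as a simple scaling of the user's command velocities by the distance to the nearest obstacle. The sketch below illustrates that idea with rospy; the topic names, thresholds and linear scaling law are assumptions, and it is not the published HSC/CVF/DWA implementation.

    #!/usr/bin/env python
    # Illustrative proximity-based velocity scaling (safeguarding-style).
    # Topic names, thresholds and the linear scaling law are assumptions.
    import rospy
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import LaserScan

    STOP_DIST = 0.35    # metres: inside this range the chair is stopped
    SLOW_DIST = 1.50    # metres: scaling starts inside this range


    class Safeguard(object):
        def __init__(self):
            self.scale = 1.0
            self.pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
            rospy.Subscriber('scan', LaserScan, self.on_scan)        # combined scan
            rospy.Subscriber('joy_vel', Twist, self.on_joy_vel)      # user's command

        def on_scan(self, scan):
            # Use the closest valid return as a crude proximity measure.
            valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
            if not valid:
                return
            d = min(valid)
            if d <= STOP_DIST:
                self.scale = 0.0
            elif d >= SLOW_DIST:
                self.scale = 1.0
            else:
                self.scale = (d - STOP_DIST) / (SLOW_DIST - STOP_DIST)

        def on_joy_vel(self, twist):
            # Scale the user's command before it reaches the motor controller.
            out = Twist()
            out.linear.x = twist.linear.x * self.scale
            out.angular.z = twist.angular.z * self.scale
            self.pub.publish(out)


    if __name__ == '__main__':
        rospy.init_node('safeguard')
        Safeguard()
        rospy.spin()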

2. Localization and mapping

The following software components are related to the localization and mapping tasks:

  • Laser Combiner: takes the readings of the three laser scanners and combines them into a single message to obtain a coherent obstacle map (see the sketch after this list).
  • Laser Scan Matcher¹: takes the combined message from Laser Combiner and estimates the wheelchair's odometry using an iterative closest point (ICP) algorithm.
  • Adaptive Monte Carlo Localization (AMCL)²: receives messages from Laser Combiner and Laser Scan Matcher to localize the wheelchair on a pre-existing map.
  • Obstacle Avoidance: this module moderates the input joystick signal according to the wheelchair's proximity to obstacles, using the DWA to avoid collisions.

Take a look at Soh and Demiris (2012) for more information about these nodes.
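
As an illustration of what Laser Combiner does, the sketch below caches the latest scan from three hypothetical laser topics and merges them into a single point cloud message. The topic and frame names are assumptions, and the transformation of each scan into a common frame (which the real node has to handle, e.g. via tf) is omitted for brevity.

    #!/usr/bin/env python
    # Illustrative merger of three LaserScan topics into one point cloud.
    # Topic and frame names are assumptions; transforming each scan into a
    # common frame (e.g. with tf) is omitted for brevity.
    import math

    import rospy
    from geometry_msgs.msg import Point32
    from sensor_msgs.msg import LaserScan, PointCloud

    SCAN_TOPICS = ['scan_front', 'scan_left', 'scan_right']   # hypothetical names


    class LaserCombiner(object):
        def __init__(self):
            self.latest = {}
            self.pub = rospy.Publisher('combined_cloud', PointCloud, queue_size=1)
            for topic in SCAN_TOPICS:
                rospy.Subscriber(topic, LaserScan, self.on_scan, callback_args=topic)

        def on_scan(self, scan, topic):
            self.latest[topic] = scan
            if len(self.latest) == len(SCAN_TOPICS):
                self.publish_cloud()

        def publish_cloud(self):
            cloud = PointCloud()
            cloud.header.stamp = rospy.Time.now()
            cloud.header.frame_id = 'base_link'   # assumes scans share this frame
            for scan in self.latest.values():
                angle = scan.angle_min
                for r in scan.ranges:
                    if scan.range_min < r < scan.range_max:
                        cloud.points.append(Point32(r * math.cos(angle),
                                                    r * math.sin(angle), 0.0))
                    angle += scan.angle_increment
            self.pub.publish(cloud)


    if __name__ == '__main__':
        rospy.init_node('laser_combiner')
        LaserCombiner()
        rospy.spin()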

3. Communication with the user

These nodes are in charge of communicating wheelchair alerts to the user:

  • Nao Director: this node coordinates Nao's movements and speech. It instructs Nao to execute wake-up and power-off animations and provides a random behavior (looking around while blinking) for the humanoid robot. Nao Director receives messages from the Navigation Reporter and Obstacle Reporter nodes; when it receives an alert from one of them, it stops the random behavior in order to indicate the direction the user must follow or the location of the obstacle. This node is also able to command an on-screen simulated Nao (a sketch of this alert handling appears after the list).
  • Wayfinder Software: a more traditional driving aid. When it receives an alert from Navigation Reporter, it displays an arrow in a computer window for three seconds, indicating the direction the user must follow.
  • Computer Voice: another traditional driving aid. It receives messages from Navigation Reporter and speaks the received instructions aloud.
  • Navigation Reporter: receives messages from AMCL and raises an alert whenever the user has deviated from a pre-recorded path on the map; it also indicates the direction the user should follow at the junctions of that path (similar alerts from Obstacle Reporter occurring within a short period of time are ignored). A sketch of this deviation check appears after the list.
  • Obstacle Reporter: receives messages from the AMCL and Obstacle Avoidance nodes and raises an alert if the user is driving towards an obstacle (similar alerts from Navigation Reporter occurring within a short period of time are ignored).

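The alert handling described for Nao Director can be sketched as a node whose random idle behavior is paused whenever a reporter node raises an alert. The topics, message type, timings and command strings below are assumptions, not the actual interface.

    #!/usr/bin/env python
    # Illustrative alert handling in the style of Nao Director: the random idle
    # behavior is paused while an alert is being communicated. Topics, message
    # type, timings and command strings are assumptions.
    import rospy
    from std_msgs.msg import String


    class NaoDirector(object):
        def __init__(self):
            self.busy_until = rospy.Time(0)
            # Hypothetical topics on which the reporter nodes publish alerts.
            rospy.Subscriber('navigation_alert', String, self.on_alert)
            rospy.Subscriber('obstacle_alert', String, self.on_alert)
            # Hypothetical sink for Nao commands (speech/animation requests).
            self.nao_pub = rospy.Publisher('nao_command', String, queue_size=1)
            rospy.Timer(rospy.Duration(2.0), self.idle_tick)

        def on_alert(self, msg):
            # Stop the random behavior and relay the alert to the robot.
            self.busy_until = rospy.Time.now() + rospy.Duration(5.0)
            self.nao_pub.publish('say_and_point: ' + msg.data)

        def idle_tick(self, event):
            # Random looking-around/blinking only while no alert is active.
            if rospy.Time.now() > self.busy_until:
                self.nao_pub.publish('idle_behavior')


    if __name__ == '__main__':
        rospy.init_node('nao_director')
        NaoDirector()
        rospy.spin()
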
Take a look at Sarabia and Demiris (2013) for more information about these nodes.
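
A minimal sketch of the kind of deviation check Navigation Reporter performs is shown below: the wheelchair's AMCL pose is compared against a pre-recorded path and an alert is raised when it strays too far, unless a similar alert was raised recently. The waypoints, threshold, topic names and suppression window are assumptions.

    #!/usr/bin/env python
    # Illustrative path-deviation check in the style of Navigation Reporter.
    # The waypoints, threshold, topics and suppression window are assumptions.
    import math

    import rospy
    from geometry_msgs.msg import PoseWithCovarianceStamped
    from std_msgs.msg import String

    # Hypothetical pre-recorded path, as (x, y) waypoints in the map frame.
    PATH = [(0.0, 0.0), (2.0, 0.0), (4.0, 1.0), (6.0, 1.0)]
    DEVIATION_THRESHOLD = 0.75   # metres
    SUPPRESSION_WINDOW = 3.0     # seconds between alerts of either kind


    class NavigationReporter(object):
        def __init__(self):
            self.last_alert = rospy.Time(0)
            self.pub = rospy.Publisher('navigation_alert', String, queue_size=1)
            rospy.Subscriber('amcl_pose', PoseWithCovarianceStamped, self.on_pose)
            # A recent obstacle alert suppresses navigation alerts for a while.
            rospy.Subscriber('obstacle_alert', String, self.on_obstacle_alert)

        def on_obstacle_alert(self, msg):
            self.last_alert = rospy.Time.now()

        def on_pose(self, msg):
            x = msg.pose.pose.position.x
            y = msg.pose.pose.position.y
            # Distance to the closest waypoint of the pre-recorded path.
            deviation = min(math.hypot(x - wx, y - wy) for wx, wy in PATH)
            recent = (rospy.Time.now() - self.last_alert) < rospy.Duration(SUPPRESSION_WINDOW)
            if deviation > DEVIATION_THRESHOLD and not recent:
                self.last_alert = rospy.Time.now()
                self.pub.publish('off_path: %.2f m from recorded route' % deviation)


    if __name__ == '__main__':
        rospy.init_node('navigation_reporter')
        NavigationReporter()
        rospy.spin()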

4. Miscellaneous

This last group of software components performs tasks of various types: 

  • Asteroids Game: developed as a secondary task for path-planning experiments; the objective of the game is to move a spaceship away from the incoming asteroids using the up and down arrow keys.
  • Nao Simulator: receives messages from Nao Director; it is essentially an on-screen simulated Nao.

Take a look at Sarabia and Demiris (2013) for more information about these nodes.


1. Node written by Ivan Dryanovski, William Morris and Andrea Censi, available at http://wiki.ros.org/laser_scan_matcher
2. Node written by Brian P. Gerkey and Andrew Howard, available at http://wiki.ros.org/amcl